Tiffany Chan

Neural Network Assignment - Bank Churn Assignment

In [ ]:
#Install tensorflow
!pip install tensorflow==2.0
Collecting tensorflow==2.0
  Downloading tensorflow-2.0.0-cp36-cp36m-manylinux2010_x86_64.whl (86.3MB)
  (Dependency-resolution output omitted.)
Building wheels for collected packages: gast
  Building wheel for gast (setup.py) ... done
Successfully built gast
ERROR: tensorflow-probability 0.12.1 has requirement gast>=0.3.2, but you'll have gast 0.2.2 which is incompatible.
Installing collected packages: tensorflow-estimator, keras-applications, gast, tensorboard, tensorflow
Successfully installed gast-0.2.2 keras-applications-1.0.8 tensorboard-2.0.2 tensorflow-2.0.0 tensorflow-estimator-2.0.1
In [ ]:
#Let's import all our regular libraries

import numpy as np
import pandas as pd
import seaborn as sns
from sklearn import metrics
import matplotlib.pyplot as plt
%matplotlib inline
In [ ]:
#Let's import TensorFlow and its accompanying components

import tensorflow as tf
from tensorflow.keras import Sequential                      #For building neural network models
from tensorflow.keras.layers import Dense                    #For the fully connected layers
from tensorflow.keras.layers import Conv1D,Flatten           #Imported in case convolutional/flattening layers are needed (not used below)

1. Read the dataset

In [ ]:
from google.colab import files
uploaded = files.upload()
Saving bank.csv to bank.csv
In [ ]:
import io
df = pd.read_csv(io.BytesIO(uploaded['bank.csv']))
In [ ]:
#Let's look to see if there are any duplicates just in case.
from collections import Counter        #Let's use this to check for duplicates
cust_ids = df['CustomerId']            #Let's compare the customer IDs (avoid shadowing the built-in id)
d = Counter(cust_ids)
res = [k for k, v in d.items() if v > 1]
print(res)
[]

There are no duplicates in the dataset. Let's proceed to delete the features with unique identifiers.
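
Pandas can also perform the same check directly (a minimal equivalent sketch, not run in this notebook):

df['CustomerId'].duplicated().any()    #Returns False when every CustomerId is unique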

2. Drop the columns which are unique for all users like IDs (5 points)

In [ ]:
#Let's drop the unique identifier columns: CustomerId, Surname and RowNumber
df = df.drop("CustomerId" , axis=1)            
df = df.drop("Surname", axis = 1)
df = df.drop("RowNumber", axis = 1)
df.head()                              #Checking to see if the unique identifiers are deleted.
Out[ ]:
   CreditScore Geography  Gender  Age  Tenure    Balance  NumOfProducts  HasCrCard  IsActiveMember  EstimatedSalary  Exited
0          619    France  Female   42       2       0.00              1          1               1        101348.88       1
1          608     Spain  Female   41       1   83807.86              1          0               1        112542.58       0
2          502    France  Female   42       8  159660.80              3          1               0        113931.57       1
3          699    France  Female   39       1       0.00              2          0               0         93826.63       0
4          850     Spain  Female   43       2  125510.82              1          1               1         79084.10       0
In [ ]:
#Descriptive Statistics
df.describe()                      
Out[ ]:
        CreditScore           Age        Tenure        Balance  NumOfProducts    HasCrCard  IsActiveMember  EstimatedSalary        Exited
count  10000.000000  10000.000000  10000.000000   10000.000000   10000.000000  10000.00000    10000.000000     10000.000000  10000.000000
mean     650.528800     38.921800      5.012800   76485.889288       1.530200      0.70550        0.515100    100090.239881      0.203700
std       96.653299     10.487806      2.892174   62397.405202       0.581654      0.45584        0.499797     57510.492818      0.402769
min      350.000000     18.000000      0.000000       0.000000       1.000000      0.00000        0.000000        11.580000      0.000000
25%      584.000000     32.000000      3.000000       0.000000       1.000000      0.00000        0.000000     51002.110000      0.000000
50%      652.000000     37.000000      5.000000   97198.540000       1.000000      1.00000        1.000000    100193.915000      0.000000
75%      718.000000     44.000000      7.000000  127644.240000       2.000000      1.00000        1.000000    149388.247500      0.000000
max      850.000000     92.000000     10.000000  250898.090000       4.000000      1.00000        1.000000    199992.480000      1.000000

In the descriptive statistics above, it is the continuous variables that are worth examining; these summary numbers are not meaningful for the categorical variables. Age is clearly right-skewed: the mean is around 39, while the maximum is 92.

For the purpose of this exercise, examining these variables bivariately against the 'Exited' target variable may generate more understanding.
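
As a quick first cut at that bivariate view, group means by churn status can be compared directly (a minimal sketch using the df dataframe loaded above; not run in this notebook):

df.groupby('Exited')[['CreditScore', 'Age', 'Balance', 'EstimatedSalary']].mean()    #Mean of each continuous feature per churn group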

In [ ]:
#Finding out if there are any NAs and null values. This is to see if we need to impute any null values.
print(df.isnull().sum())
CreditScore        0
Geography          0
Gender             0
Age                0
Tenure             0
Balance            0
NumOfProducts      0
HasCrCard          0
IsActiveMember     0
EstimatedSalary    0
Exited             0
dtype: int64
In [ ]:
#Let's look at the dimensions of the data.
df.shape
Out[ ]:
(10000, 11)
In [ ]:
#Let's look at what the dtype is for each feature.
df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 10000 entries, 0 to 9999
Data columns (total 11 columns):
 #   Column           Non-Null Count  Dtype  
---  ------           --------------  -----  
 0   CreditScore      10000 non-null  int64  
 1   Geography        10000 non-null  object 
 2   Gender           10000 non-null  object 
 3   Age              10000 non-null  int64  
 4   Tenure           10000 non-null  int64  
 5   Balance          10000 non-null  float64
 6   NumOfProducts    10000 non-null  int64  
 7   HasCrCard        10000 non-null  int64  
 8   IsActiveMember   10000 non-null  int64  
 9   EstimatedSalary  10000 non-null  float64
 10  Exited           10000 non-null  int64  
dtypes: float64(2), int64(7), object(2)
memory usage: 859.5+ KB
In [ ]:
#Let's look at the frequencies for categorical variables (univariate analysis). 
#Examining the categorical variables.
print("Geography Frequency")
print(df['Geography'].value_counts())
print("")
print("Gender Frequency")
print(df['Gender'].value_counts())
print("")
print("Tenure")
print(df['Tenure'].value_counts())
print("")
print("Number of Products Frequency")
print(df['NumOfProducts'].value_counts())
print("")
print("Has Credit Card Frequency")
print(df['HasCrCard'].value_counts())
print("")
print("Is An Active Member Frequency")
print(df['IsActiveMember'].value_counts())
print("")
print("Exited Frequency (Target Variable)")
df['Exited'].value_counts()
Geography Frequency
France     5014
Germany    2509
Spain      2477
Name: Geography, dtype: int64

Gender Frequency
Male      5457
Female    4543
Name: Gender, dtype: int64

Tenure
2     1048
1     1035
7     1028
8     1025
5     1012
3     1009
4      989
9      984
6      967
10     490
0      413
Name: Tenure, dtype: int64

Number of Products Frequency
1    5084
2    4590
3     266
4      60
Name: NumOfProducts, dtype: int64

Has Credit Card Frequency
1    7055
0    2945
Name: HasCrCard, dtype: int64

Is An Active Member Frequency
1    5151
0    4849
Name: IsActiveMember, dtype: int64

Exited Frequency (Target Variable)
Out[ ]:
0    7963
1    2037
Name: Exited, dtype: int64

After looking at these univariate frequencies for the categorical variables, it is clear that the data is very unbalanced. The target variable ('Exited') is perhaps the most problematic: the number of customers who stayed (7,963) is almost four times the number who exited (2,037). For this reason, a neural network trained on this data may be biased towards predicting that people stay.

You can see a similar imbalance in the HasCrCard variable: customers with at least one credit card (7,055) outnumber those without (2,945) by more than two to one.

The Geography feature is also skewed: there are roughly twice as many French customers as there are customers from either Spain or Germany.

Tenure is not very evenly dispersed either.

In terms of the number of products customers hold, one product versus two products is evenly distributed enough. On the other hand, only 266 customers hold 3 products, and just 60 hold 4.

Gender and IsActiveMember are the only categorical variables that seem to be evenly distributed.


Some potential solutions to solve this issue:

  1. Downsample (cut down the zeroes (0) in the target variable). Disadvantage: removes a lot of data, so the model may not capture relevant information that can predict the target variable outcome.

  2. Upsample (synthetically create more ones (1) in the target variable). Disadvantage: can overfit the model to the training data and bomb on the test data.

  3. Weight the loss. The loss function here is cross-entropy (binary_crossentropy for this problem), and on its own it does not correct for class imbalance; passing class weights so that minority-class errors cost more can counteract the bias towards the majority class (see the sketch after the next paragraph).


Since the assignment is only asking for accuracy, it is fine to leave the data as is, because the accuracy metric mostly reflects how well the majority class (here, the people who stayed with the bank) is predicted. To judge how well the minority class (those who actually did exit) is predicted, precision and recall are the important metrics.
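
For reference, this is roughly how class weighting could be wired into Keras (a hedged sketch, not run in this notebook; the weights follow from the 7,963/2,037 split observed above):

from sklearn.utils.class_weight import compute_class_weight
import numpy as np

classes = np.array([0, 1])
weights = compute_class_weight(class_weight='balanced', classes=classes, y=df['Exited'].values)
class_weight = dict(zip(classes, weights))      #Approx. {0: 0.63, 1: 2.45}

#The dict would then be passed to training, e.g.:
#model.fit(X_train, y_train, class_weight=class_weight, ...)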

3. Perform bivariate analysis and give your insights from the same (5 points)

In [ ]:
sns.pairplot(data = df)
Out[ ]:
<seaborn.axisgrid.PairGrid at 0x7fe9a55cd9b0>

From the plots above, the most important panels are those where the predictor variables are plotted against the target variable ('Exited'), because they show whether there are strong predictors in the data. Looking at the pairplots, no single variable is particularly strong at predicting whether a customer will stay or leave, because the data points are pretty much evenly distributed amongst those who left and those who stayed with the bank.

Despite this, there are certain tendencies we can observe: those with the lowest credit scores are more likely to leave, those in the higher age range are more likely to stay with the bank, and those with the highest balances (the leverage points) are more likely to leave.

Although many of the plots show the data crowded into a line, we should not rush to eliminate these features, because there may still be relationships with the target variable that are not immediately apparent.

When it comes to multicollinearity, neural networks aren't really affected by it because of the complexity of their architecture. Neural networks are overparameterized, and each neuron forms its own linear combination of the previous layer's outputs, so correlated inputs get absorbed across many weights. Multicollinearity, which destabilizes simpler models fit on smaller samples, therefore has little influence on the make-up of the neural network, and it may not be that important to examine Pearson correlations between the independent variables.

However, from these plots we can also observe leverage points. Leverage points are similar to outliers, but they are extremities observable in bivariate plots. Even if we deleted outliers based on univariate analysis, there may still be leverage points to handle.

For age x exited and balance x exited, there clearly are extremities. In this case, we need to decide whether or not to delete these points. Since this is a neural network classification problem, we need to take the activation functions into consideration and ask which one(s) could help obtain better results.


Discussion of potential activation functions:

ReLU is an activation function that is easily influenced by outliers and extremities, because it is unbounded above. So, if we want to work with ReLU, it may be better to delete the leverage points beforehand.

The sigmoid activation function is more robust because it squashes extreme values into the (0, 1) range. For this reason, sigmoid may be a better option if we choose not to delete those cases.

Softmax is also an activation function for classification models, but it is used in the output layer to convert the network's raw calculations into interpretable probabilities across multiple classes.
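
A small numeric aside that becomes relevant later: softmax normalizes across the output units, so applied to a single unit it always returns 1 (a minimal sketch, not run in this notebook):

import numpy as np

logit = np.array([2.3])                          #A single raw output score
print(np.exp(logit) / np.exp(logit).sum())       #Softmax over one unit: always [1.]
print(1 / (1 + np.exp(-logit)))                  #Sigmoid: a usable probability, ~[0.909]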

4. Distinguish the feature and target set and divide the data set into training and test sets (5 points)

In [ ]:
#Let's divide the dataset into predictor variables and the target variable.

X = df.drop("Exited", axis = 1)   # This X data will be all the predictor variables. So, 'Exited' must be removed.


y= df.iloc[:,-1].values          # This is for the target variable 'Exited', which is the last column in the df dataframe. It needs to be made into an array.
In [ ]:
#Double check to see if exited is removed from X
X.head()
Out[ ]:
   CreditScore Geography  Gender  Age  Tenure    Balance  NumOfProducts  HasCrCard  IsActiveMember  EstimatedSalary
0          619    France  Female   42       2       0.00              1          1               1        101348.88
1          608     Spain  Female   41       1   83807.86              1          0               1        112542.58
2          502    France  Female   42       8  159660.80              3          1               0        113931.57
3          699    France  Female   39       1       0.00              2          0               0         93826.63
4          850     Spain  Female   43       2  125510.82              1          1               1         79084.10
In [ ]:
#Get dummies for the Geography categorical variable. We don't need dummies for the other categorical variables because they are mostly binary.
#As for Tenure, it can be interpreted as a continuous variable, so Geography is really the only variable we have to alter.
X = pd.get_dummies(X, columns=['Geography'])
In [ ]:
#We should use a label encoder for those variables like Gender where the categories should be converted to numbers. 
from sklearn.preprocessing import LabelEncoder

for col in X.select_dtypes(include=['object']):
    encoder=LabelEncoder()
    X[col]=encoder.fit_transform(X[col])
In [ ]:
#Dividing into train and test set.

from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size = 0.2, random_state = 0)
In [ ]:
#Let's make a copy of the training data so that we can run one version through our initial model and another through the improved model.
X_train_init = X_train.copy()    #Same data; X_train_init feeds the preliminary model and X_train feeds the final model after tuning the hyperparameters.
y_train_init = y_train.copy()    #Same data; y_train_init feeds the preliminary model and y_train feeds the final model after tuning the hyperparameters.

5. Normalize the train and test data (10 points)

In [ ]:
#Normalize the train data for the initial model first.
#We do not yet know whether preprocessing.normalize or StandardScaler will work better,
#so we will normalize/standardize the test data later based on whichever approach yields better results.

import pandas as pd
from sklearn import preprocessing

X_train_init = preprocessing.normalize(X_train_init)               #Normalize the initial model train data.
In [ ]:
#Make sure we understand the shape of the data so that we know what to put into the input layer for dimensions.
print(X_train_init.shape)
print(y_train_init.shape)
(8000, 12)
(8000,)

6. Initialize & build the model. Identify the points of improvement and implement the same. (20)

In [ ]:
#Build the neural network framework for the initial model.

from tensorflow import keras
from tensorflow.keras.layers import Dense,Conv1D,Flatten
from tensorflow.keras.models import Sequential, Model

model = Sequential()
In [ ]:
#Let's add the layers (input, hidden and output)

model.add(Dense(100, input_shape = (12,)))            #No activation specified, so this input layer applies only a linear transformation to the 12 inputs.
model.add(Dense(300, activation='sigmoid'))           #Use one of the more common activation functions here, ReLU or sigmoid. Since we did not remove the outliers/leverage points, sigmoid's squashing behaviour is the safer choice.
model.add(Dense(1,activation='softmax'))              #Since this is a classification model, let's try softmax in the output layer. (As discussed below, this turns out to be the flaw in this initial model.)
In [ ]:
#Designate the optimizer, the metrics used for evaluation and the loss function.
model.compile(optimizer='adam',metrics=['accuracy'],loss='binary_crossentropy')          #I tried categorical_crossentropy here but kept getting an error, so I changed it to binary_crossentropy, which fits because this is a binary classification problem after all.
In [ ]:
#Training the model. Start with a fairly large batch size, and set the number of epochs to how many times we want the data to go through the model.
#We should also include a validation split to check that we are not overfitting, which is very important.

model.fit(X_train_init,y_train_init,batch_size=100,validation_split=0.1,epochs=100)
Train on 7200 samples, validate on 800 samples
Epoch 1/100
7200/7200 [==============================] - 1s 116us/sample - loss: 12.2070 - accuracy: 0.2039 - val_loss: 12.1899 - val_accuracy: 0.2050
(Epochs 2-99 omitted: every epoch reports the identical values loss: 12.2070 - accuracy: 0.2039 - val_loss: 12.1899 - val_accuracy: 0.2050, i.e. the model never learns.)
Epoch 100/100
7200/7200 [==============================] - 0s 45us/sample - loss: 12.2070 - accuracy: 0.2039 - val_loss: 12.1899 - val_accuracy: 0.2050
Out[ ]:
<tensorflow.python.keras.callbacks.History at 0x7fe984415278>

This was the first model I ran for this assignment.

As you can tell, the accuracy score is very low (approx. 0.2) on the training data, and it never changes across epochs. I ran many models in order to improve it, and learned that softmax is used mainly for multi-class classification.


Changing the activation function in the output layer:

Softmax converts the raw scores into probabilities that sum to 1 across the output units, so with a single output neuron it always returns 1 and the model can never predict the negative class (exactly the failure shown by the constant 0.2039 accuracy above, and by the numeric aside in the bivariate section). For a binary classification network with one output neuron, the sigmoid activation, as in logistic regression, is the right choice, and it yielded a far better result: an improvement of roughly 0.66 in absolute accuracy (from about 0.20 to about 0.86).


Changing normalization to using StandardScaler:

Using StandardScaler improved the accuracy of the model. In hindsight this makes sense: preprocessing.normalize rescales each row to unit norm, so a large-magnitude feature like EstimatedSalary dominates each row and the other features are crushed towards zero, whereas StandardScaler rescales each column to zero mean and unit variance so every feature contributes on a comparable scale.
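
The difference is easy to see on two of the actual columns (a minimal sketch using two rows of CreditScore and EstimatedSalary values from df.head() above; not run in this notebook):

import numpy as np
from sklearn.preprocessing import StandardScaler, normalize

demo = np.array([[619., 101348.88],
                 [608., 112542.58]])             #CreditScore, EstimatedSalary for two customers
print(normalize(demo))                           #Row-wise unit norm: rows become ~[0.006, 0.99998], CreditScore is all but erased
print(StandardScaler().fit_transform(demo))      #Column-wise: each feature gets mean 0, std 1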


Increasing the epochs:

After running many models on the training data with a validation set, I realized that increasing the epochs improves the training accuracy but risks overfitting, because the validation accuracy does not improve along with the training accuracy (one common safeguard is sketched below).
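
Keras's EarlyStopping callback can cap the epoch count automatically (a minimal sketch under the same setup as this notebook; not actually run here):

from tensorflow.keras.callbacks import EarlyStopping

#Stop once val_loss has not improved for 10 consecutive epochs, and keep the best weights.
early_stop = EarlyStopping(monitor='val_loss', patience=10, restore_best_weights=True)

#It would be passed to training like so:
#model.fit(X_train, y_train, batch_size=60, validation_split=0.1, epochs=100, callbacks=[early_stop])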


Adding layers and nodes:

Adding a hidden layer and extra nodes in the hidden layers also improved the predictive accuracy.


Upsampling/Downsampling:

There were also attempts to upsample and downsample the target classes to see whether that would improve the model's accuracy, roughly as sketched below. Both only brought the accuracy to approx. 0.5: because resampling shifts the emphasis onto the minority class, accuracy was never likely to improve drastically this way. As a result, upsampling and downsampling were abandoned for better practices, like hyperparameter tuning, that produced a more noteworthy improvement in the accuracy metric.
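
For reference, the upsampling experiment would look roughly like this (a hedged sketch with sklearn.utils.resample; the exact code used for the experiments is not shown in this notebook):

from sklearn.utils import resample
import pandas as pd

majority = df[df['Exited'] == 0]                #7,963 customers who stayed
minority = df[df['Exited'] == 1]                #2,037 customers who exited

#Sample the minority class with replacement until the classes match in size.
minority_up = resample(minority, replace=True, n_samples=len(majority), random_state=0)
df_balanced = pd.concat([majority, minority_up])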

IMPROVED MODEL

In [ ]:
#We are changing from preprocessing.normalize to StandardScaler because the results were better. This also answers question 5 about normalizing the train and test data.

from sklearn.preprocessing import MinMaxScaler,StandardScaler
sc = StandardScaler()
X_train = sc.fit_transform(X_train)
X_test = sc.transform(X_test)
In [ ]:
#Make sure we understand the shape of the data so that we know what to put into the input layer for dimensions.
print(X_train.shape)
print(y_train.shape)
(8000, 12)
(8000,)
In [ ]:
# Implementing a new framework for the model and adding the necessary layers for the improved model.
model=Sequential()
model.add(Dense(180,input_shape=(12,)))             # 12 input dimensions.
model.add(Dense(100,activation='sigmoid'))
model.add(Dense(1,activation='sigmoid'))            #Sigmoid activation function because this is a binary classification problem.
In [ ]:
#Compile as before with binary_crossentropy, which suits this binary classification problem; the key change from the initial model is the sigmoid output activation above.
model.compile(optimizer='adam',metrics=['accuracy'],loss='binary_crossentropy')
In [ ]:
#Train the model, and use the validation split to check that we are not overfitting.
model.fit(X_train,y_train,batch_size=60,validation_split=0.1,epochs=100)
Train on 7200 samples, validate on 800 samples
Epoch 1/100
7200/7200 [==============================] - 1s 123us/sample - loss: 0.4527 - accuracy: 0.8018 - val_loss: 0.4237 - val_accuracy: 0.8175
Epoch 2/100
7200/7200 [==============================] - 0s 53us/sample - loss: 0.4314 - accuracy: 0.8093 - val_loss: 0.4093 - val_accuracy: 0.8225
Epoch 3/100
7200/7200 [==============================] - 0s 55us/sample - loss: 0.4227 - accuracy: 0.8169 - val_loss: 0.3945 - val_accuracy: 0.8263
(Epochs 4-97 omitted: training loss falls steadily from about 0.41 to 0.25 and training accuracy climbs towards 0.90, while validation accuracy plateaus around 0.86-0.87 after roughly 20 epochs.)
Epoch 98/100
7200/7200 [==============================] - 0s 59us/sample - loss: 0.2495 - accuracy: 0.8969 - val_loss: 0.3615 - val_accuracy: 0.8650
Epoch 99/100
7200/7200 [==============================] - 0s 58us/sample - loss: 0.2479 - accuracy: 0.8976 - val_loss: 0.3634 - val_accuracy: 0.8600
Epoch 100/100
7200/7200 [==============================] - 0s 57us/sample - loss: 0.2473 - accuracy: 0.8985 - val_loss: 0.3590 - val_accuracy: 0.8662
Out[ ]:
<tensorflow.python.keras.callbacks.History at 0x7fe98361a668>

One can see that the model's accuracy on the training data improved to about 0.90, with a much smaller loss, while the validation accuracy levels off around 0.87. The model is performing well and is only mildly overfit.

In [ ]:
loss, acc = model.evaluate(X_test, y_test, verbose=0)
print('Accuracy: %.3f'  % acc)
print('Loss: %.3f' % loss)
Accuracy: 0.859
Loss: 0.359

The accuracy for the test data is pretty close to the train and validation scores. So, the model is pretty decent when it comes to its predictions.

7. Predict the results using 0.5 as a threshold (10 points)

In [ ]:
#Predicting results with a 0.5 threshold.
pred=model.predict(X_test)        #Predicted probabilities for the test data.
pred=np.where(pred>0.5,1,0)       #Apply the 0.5 threshold to get class labels.
pred                              #Display the predicted classes.
Out[ ]:
array([[0],
       [0],
       [0],
       ...,
       [0],
       [0],
       [0]])

8. Print the Accuracy score and confusion matrix (5 points)

In [ ]:
from sklearn.metrics import accuracy_score, confusion_matrix, precision_score, recall_score, f1_score, precision_recall_curve

accuracy_score(y_test,pred)
Out[ ]:
0.8595

The accuracy score on the test data is 0.86, with a loss of 0.36, which is fairly strong. Comparing this with the training and validation results confirms that the model is not badly overfit: training accuracy reached 0.90 (loss = 0.25) and validation accuracy 0.87 (loss = 0.36), so performance held up well on unseen data.

In [ ]:
#Compute the accuracy, recall, precision, F-score and confusion matrix for the test data.

y_pred_cls = model.predict_classes(X_test, batch_size=200, verbose=0)
print('Accuracy_Model (Dropout): ' + str(model.evaluate(X_test, y_test)[1]))
print('Recall_Score: ' + str(recall_score(y_test, y_pred_cls)))
print('Precision_Score: ' + str(precision_score(y_test, y_pred_cls)))
print('F-score: ' + str(f1_score(y_test, y_pred_cls)))
confusion_matrix(y_test, y_pred_cls)
2000/2000 [==============================] - 0s 35us/sample - loss: 0.2500 - accuracy: 0.8595
Accuracy_Model (Dropout): 0.8595
Recall_Score: 0.5604938271604938
Precision_Score: 0.6878787878787879
F-score: 0.617687074829932
Out[ ]:
array([[1492,  103],
       [ 178,  227]])
In [ ]:
#Making the confusion matrix easier to read by labelling rows and columns.

confmatrix = pd.DataFrame(confusion_matrix(y_test, pred),
                          index=['Actual_0', 'Actual_1'],
                          columns=['Predicted_0', 'Predicted_1'])
confmatrix
Out[ ]:
          Predicted_0  Predicted_1
Actual_0         1492          103
Actual_1          178          227
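
An optional step, not in the original notebook: the same labelled matrix can be rendered as a heatmap, which makes the imbalance between the two classes easier to see at a glance. This is a minimal sketch, assuming confmatrix from the cell above and that matplotlib and seaborn are available:

In [ ]:
#Optional: visualize the labelled confusion matrix as a heatmap.
import matplotlib.pyplot as plt
import seaborn as sns

sns.heatmap(confmatrix, annot=True, fmt='d', cmap='Blues')
plt.ylabel('Actual')
plt.xlabel('Predicted')
plt.show()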

Accuracy mostly reflects the correctly predicted cases. Here, its high value is driven largely by the customers who were predicted to stay and actually stayed. As a result, accuracy tells us little about how the model misclassifies churners.

Precision and recall are better metrics for judging how well this model predicts customers' churning behavior, because they focus on the minority class.

Precision: True Positives / (True Positives + False Positives) = 227 / (227 + 103) ≈ 0.69. About 69% of those who were predicted to leave actually left.

Recall: True Positives / (True Positives + False Negatives) = 227 / (227 + 178) ≈ 0.56. About 56% of those who actually left were predicted to leave.
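
As a sanity check, these numbers can be recomputed directly from the confusion matrix above. This is a minimal sketch, assuming y_test and pred are the same arrays used earlier in the notebook:

In [ ]:
#Sanity check: recompute precision, recall, and F1 from the confusion matrix.
from sklearn.metrics import confusion_matrix, f1_score, precision_score, recall_score

tn, fp, fn, tp = confusion_matrix(y_test, pred).ravel()

precision = tp / (tp + fp)                          # 227 / (227 + 103), about 0.69
recall = tp / (tp + fn)                             # 227 / (227 + 178), about 0.56
f1 = 2 * precision * recall / (precision + recall)  # about 0.62

print('Precision:', precision)
print('Recall:', recall)
print('F1-score:', f1)

#The same values via sklearn's built-in helpers.
print(precision_score(y_test, pred), recall_score(y_test, pred), f1_score(y_test, pred))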

These metrics show that the model is only moderately good at capturing the customer base's churning behavior, despite its high accuracy on the customers who stayed. Even so, considering that we used activation functions whose strong squashing behavior dampens outliers, and a loss function that copes with imbalanced data without upsampling or downsampling the target variable, the neural network performed reasonably well.

The neural network performs well because of its components and the careful detail that goes into building it. With bank data, however, more traditional machine learning models may be better. Models like random forest, gradient boosting, and even logistic regression may be stronger choices for predicting banking behavior and business earning potential.
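
For comparison, one of these traditional models is quick to try on the same split. The sketch below fits a random forest as a baseline; it is hypothetical, assuming the X_train, X_test, y_train, y_test split created earlier in the notebook, and no results are claimed here:

In [ ]:
#Hypothetical baseline: random forest on the same train/test split.
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report

rf = RandomForestClassifier(n_estimators=200, random_state=42)
rf.fit(X_train, y_train)
rf_pred = rf.predict(X_test)

#Per-class precision/recall/F1 for a direct comparison with the network.
print(classification_report(y_test, rf_pred))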

Neural networks tend to perform better on tasks like image recognition than they do on traditional tabular banking problems.
